Convergence of Proximal-Gradient Stochastic Variational Inference under Non-Decreasing Step-Size Sequence

Authors

  • Mohammad Emtiyaz Khan
  • Reza Babanezhad
  • Wu Lin
  • Mark W. Schmidt
  • Masashi Sugiyama
Abstract

Several recent works have explored stochastic gradient methods for variational inference that exploit the geometry of the variational-parameter space. However, the theoretical properties of these methods are not well understood, and these methods typically apply only to conditionally conjugate models. We present a new stochastic method for variational inference which exploits the geometry of the variational-parameter space and also yields simple closed-form updates even for non-conjugate models. We also give a convergence-rate analysis of our method and of many other previous methods which exploit the geometry of the space. Our analysis generalizes existing convergence results for stochastic mirror descent on non-convex objectives by using a more general class of divergence functions. Beyond giving a theoretical justification for a variety of recent methods, our experiments show that new algorithms derived in this framework lead to state-of-the-art results on a variety of problems. Further, due to its generality, we expect that our theoretical analysis could also apply to other applications.
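To make the generic template behind these methods concrete, the sketch below runs a plain stochastic proximal-gradient iteration on a toy L1-regularized quadratic. This is an illustration of the general scheme the abstract refers to, not the paper's algorithm: the soft-thresholding proximal operator, the quadratic objective, and all names here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_prox_grad(grad_est, x0, step, lam, n_iters, rng):
    # Generic stochastic proximal-gradient iteration:
    #   x_{k+1} = prox_{step * lam * ||.||_1}(x_k - step * g_k),
    # where g_k is a noisy estimate of the smooth part's gradient.
    x = x0.copy()
    for _ in range(n_iters):
        g = grad_est(x, rng)
        x = soft_threshold(x - step * g, step * lam)
    return x

# Toy problem: minimize 0.5 * ||x - b||^2 + lam * ||x||_1, lam = 0.5.
# Its exact minimizer is soft_threshold(b, lam) = [1.5, 0.0, 0.0].
b = np.array([2.0, -0.3, 0.0])

def grad_est(x, rng):
    # Exact gradient of the smooth part plus small Gaussian noise,
    # mimicking a stochastic (e.g. Monte Carlo) gradient estimate.
    return (x - b) + 0.01 * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
x_star = stochastic_prox_grad(grad_est, np.zeros(3), 0.5, 0.5, 2000, rng)
```

With a fixed step size and small gradient noise the iterates settle in a small neighborhood of the exact minimizer `soft_threshold(b, 0.5)`.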

Related papers

Conjugate-Computation Variational Inference: Converting Variational Inference in Non-Conjugate Models to Inferences in Conjugate Models

Variational inference is computationally challenging in models that contain both conjugate and non-conjugate terms. Methods specifically designed for conjugate models, even though computationally efficient, find it difficult to deal with non-conjugate terms. On the other hand, stochastic-gradient methods can handle the non-conjugate terms but they usually ignore the conjugate structure of the mo...

Faster Stochastic Variational Inference using Proximal-Gradient Methods with General Divergence Functions

Several recent works have explored stochastic gradient methods for variational inference that exploit the geometry of the variational-parameter space. However, the theoretical properties of these methods are not well understood and these methods typically apply only to conditionally conjugate models. We present a new stochastic method for variational inference which exploits the geometry of the ...

Second-order stochastic variational inference

Stochastic gradient descent (SGD), the workhorse of stochastic optimization, is slow in theory (sub-linear convergence) and in practice (thousands of iterations), intuitively for two reasons: 1) its learning rate schedule is fixed a priori and decays to 0 rapidly enough to be square-summable. This learning rate schedule limits the step size and hence the rate of convergence for a Lipschitz ob...
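The "square-summable" schedule mentioned in this snippet is the classic Robbins-Monro requirement on SGD step sizes: the steps must sum to infinity (so the iterates can travel arbitrarily far) while their squares must sum to a finite value (so the noise is averaged out). A quick numerical check for the canonical schedule a_k = 1/k, included here purely as an illustration:

```python
import numpy as np

# Robbins-Monro conditions on SGD step sizes a_k:
#   sum_k a_k = infinity   and   sum_k a_k^2 < infinity.
# The schedule a_k = 1/k satisfies both.
K = 100_000
a = 1.0 / np.arange(1, K + 1)

partial_sum = a.sum()        # harmonic sum: grows like log(K), diverges
partial_sq = (a ** 2).sum()  # converges to pi^2 / 6 (Basel problem)
```

The first sum keeps growing without bound as `K` increases, while the second has already essentially converged to pi^2/6 at `K = 100_000`, which is exactly the tension the snippet above describes: the steps shrink fast enough to control noise but thereby cap how quickly the iterates can move.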

Stochastic Proximal Gradient Algorithms for Penalized Mixed Models

Motivated by penalized likelihood maximization in complex models, we study optimization problems where neither the function to optimize nor its gradient have an explicit expression, but its gradient can be approximated by a Monte Carlo technique. We propose a new algorithm based on a stochastic approximation of the Proximal-Gradient (PG) algorithm. This new algorithm, named Stochastic Approxima...

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition

In 1963, Polyak proposed a simple condition that is sufficient to show a global linear convergence rate for gradient descent. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong convexity (or even convexity). In this work, we show that this much-older Polyak-Łojasiewicz (PL) inequality is actually weaker than the main condition...
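The PL inequality referred to here states that 0.5 * ||grad f(x)||^2 >= mu * (f(x) - f*) for some mu > 0 and all x. The sketch below checks it numerically for a positive-definite quadratic f(x) = 0.5 * x^T A x, where the inequality holds with mu equal to the smallest eigenvalue of A and f* = 0; the random matrix and sample points are illustrative choices, not from the paper.

```python
import numpy as np

# Numerical check of the Polyak-Lojasiewicz (PL) inequality
#   0.5 * ||grad f(x)||^2 >= mu * (f(x) - f*)
# for f(x) = 0.5 * x^T A x with A positive definite, where it
# holds with mu = lambda_min(A) and optimal value f* = 0.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T + np.eye(4)          # positive definite by construction
mu = np.linalg.eigvalsh(A).min()  # smallest eigenvalue of A

pl_holds = True
for _ in range(1000):
    x = rng.standard_normal(4)
    f = 0.5 * x @ A @ x          # objective value (f* = 0)
    g = A @ x                    # gradient
    pl_holds &= bool(0.5 * g @ g >= mu * f - 1e-9)
```

The inequality follows from A^2 >= lambda_min(A) * A in the positive-semidefinite order, so every sampled point satisfies it (the `1e-9` slack only guards against floating-point round-off). Note that the PL condition can also hold for non-convex functions, which is the point of the paper summarized above.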


Journal:
  • CoRR

Volume: abs/1511.00146  Issue:

Pages:  -

Publication date: 2015